Results 1 - 3 of 3
1.
2nd International Conference on Electronic Systems and Intelligent Computing, ESIC 2021 ; 860:313-327, 2022.
Article in English | Scopus | ID: covidwho-1919736

ABSTRACT

Recently, individuals have been under lockdown with limited mobility due to the rapid worldwide spread of COVID-19 (coronavirus disease 2019), which the World Health Organization (WHO) has declared a pandemic. RT-PCR (reverse transcriptase-polymerase chain reaction) tests, which detect viral RNA from nasopharyngeal swabs, have become the norm for permitting travel within a country and to international destinations. The test is people-intensive: it requires a person to collect the sample, transportation under strict precautionary measures, and a lab technician to perform the analysis, and results may take up to two days, causing considerable inconvenience. Alternatively, X-ray images have been used, primarily by physicians, to detect COVID-19 and assess its severity. Detection of COVID-19 from X-rays can serve as a safe, faster alternative to RT-PCR testing. The method uses a Convolutional Neural Network (CNN) to classify chest X-ray scans into two categories, COVID-19 positive and negative. In this paper, a novel technique named COVIHunt, an intelligent CNN-based COVID-19 detection technique using CXR imaging, is proposed for binary classification. Experiments show that the proposed work outperforms other existing techniques. © 2022, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
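
The abstract does not describe the COVIHunt architecture itself, so the following is only a minimal sketch of the general technique it names: a small Keras CNN that classifies a chest X-ray into COVID-19 positive or negative. Layer sizes, input resolution, and training settings are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a binary CXR classifier (illustrative; not the published
# COVIHunt model). Requires TensorFlow.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_binary_cxr_cnn(input_shape=(224, 224, 1)):
    """Small CNN for COVID-19 positive vs. negative classification.

    All hyperparameters here are assumptions chosen for illustration.
    """
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of COVID-19 positive
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_binary_cxr_cnn().summary()
```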

2.
International Journal of Intelligent Systems and Applications in Engineering ; 10(2):252-259, 2022.
Article in English | Scopus | ID: covidwho-1898088

ABSTRACT

Since the Coronavirus (COVID-19) pandemic, most activities have been held digitally, necessitating real-time monitoring of participants' attendance. Previously, online attendance involved extracting the list of attendees, which was unreliable because many people muted themselves or left the meeting altogether. A technique is therefore required for collecting attendance through facial recognition that can correctly identify participants who remain online for the entire duration of a lecture. The goal of this work is to develop IOSTS, an intelligent online attendance tracking system, that can track attendance while using minimal bandwidth and maintaining user privacy. The proposed work is based on the concepts of facial recognition and edge computing, and the entire utility runs on the client's PC. From experiments with random sampling intervals, it is observed that the system achieves an accuracy of 98% in facial recognition. This approach is a foolproof method of tracking attendance and increasing digital transparency. © 2022, Ismail Saritas. All rights reserved.
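
Neither abstract specifies the recognition pipeline, so the following is only a minimal sketch of interval-based attendance via face recognition, assuming the open-source `face_recognition` package runs locally on the client (consistent with the edge-computing idea). The frame capture step, attendee names, and presence threshold are assumptions for illustration.

```python
# Minimal sketch: mark an attendee present if their face is recognised in
# enough of the frames sampled during the meeting. Requires face_recognition.
import face_recognition

def load_known_faces(photo_paths):
    """Build one reference encoding per enrolled attendee photo.

    photo_paths: dict mapping attendee name -> path to a reference photo.
    """
    known = {}
    for name, path in photo_paths.items():
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:
            known[name] = encodings[0]
    return known

def attendees_in_frame(frame_path, known):
    """Return the set of enrolled names recognised in one captured frame."""
    frame = face_recognition.load_image_file(frame_path)
    present = set()
    for encoding in face_recognition.face_encodings(frame):
        matches = face_recognition.compare_faces(list(known.values()), encoding)
        for name, hit in zip(known.keys(), matches):
            if hit:
                present.add(name)
    return present

def track_attendance(frame_paths, known, required_ratio=0.8):
    """Mark present anyone recognised in >= required_ratio of sampled frames."""
    counts = {name: 0 for name in known}
    for path in frame_paths:
        for name in attendees_in_frame(path, known):
            counts[name] += 1
    return {name: counts[name] / len(frame_paths) >= required_ratio
            for name in known}
```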

3.
2nd International Conference on Intelligent and Cloud Computing, ICICC 2021 ; 286:3-15, 2022.
Article in English | Scopus | ID: covidwho-1826293

ABSTRACT

Recently, due to the COVID-19 pandemic, classes, seminars, and meetings have been scheduled on virtual platforms, and there is a need to keep track of attendees' presence. Earlier, online attendance involved extracting the list of attendees, which was inconvenient because many people mute themselves or leave the meeting altogether. Therefore, a tool is required that captures attendance through facial recognition and can effectively identify attendees who remain online for the whole duration of the lecture. In this paper, a method is proposed to completely automate the attendance tracking system using the concept of edge computing. The tool runs alongside any video-conferencing platform, captures attendees' faces at random intervals, and uses face recognition to determine who remains present for the complete duration of the class. This novel method acts as a fail-proof way to monitor attendance and improve digital transparency. © 2022, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
